81.
Estimation of longitudinal models of relationship status between all pairs of individuals (dyads) in social networks is challenging due to the complex inter-dependencies among observations and lengthy computation times. To reduce the computational burden of model estimation, a method is developed that subsamples the “always-null” dyads in which no relationships develop throughout the period of observation. The informative sampling process is accounted for by weighting the likelihood contributions of the observations by the inverses of their sampling probabilities. This weighted-likelihood estimation method is implemented using Bayesian computation and evaluated in terms of its bias, efficiency, and speed of computation under various settings. Comparisons are also made to a full-information likelihood-based procedure that is feasible to compute only when limited follow-up observations are available. Calculations are performed on two real social networks of very different sizes. The easily computed weighted-likelihood procedure closely approximates the corresponding estimates for the full network, even at low sub-sampling fractions. The fast computation times make the weighted-likelihood approach practical and applicable to networks of any size.
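The inverse-probability weighting idea can be sketched in a few lines. The Bernoulli dyad model, the grid search, and the 1/5 sampling fraction below are illustrative stand-ins, not the paper's actual longitudinal model:

```python
import math

def weighted_loglik(theta, dyads, weights):
    # Bernoulli log-likelihood; each dyad's contribution is scaled by
    # its inverse-sampling-probability weight.
    p = 1.0 / (1.0 + math.exp(-theta))
    return sum(w * (y * math.log(p) + (1 - y) * math.log(1 - p))
               for y, w in zip(dyads, weights))

# Full data: 3 dyads with ties, 97 "always-null" dyads.
full = [1] * 3 + [0] * 97

# Keep every tie dyad (weight 1) but only every 5th always-null dyad,
# re-weighted by the inverse sampling probability 1 / (1/5) = 5.
sub, w, nulls_seen = [], [], 0
for y in full:
    if y == 1:
        sub.append(y)
        w.append(1.0)
    else:
        if nulls_seen % 5 == 0:
            sub.append(y)
            w.append(5.0)
        nulls_seen += 1

# Crude grid-search MLE of theta under each likelihood.
grid = [i / 100 for i in range(-600, 0)]
full_mle = max(grid, key=lambda t: weighted_loglik(t, full, [1.0] * 100))
sub_mle = max(grid, key=lambda t: weighted_loglik(t, sub, w))
```

The subsampled, re-weighted estimate lands close to the full-data one (about −3.51 versus −3.48 here) while evaluating fewer than a quarter as many likelihood terms.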
82.
The spectral overlap of color-sampling filters increases errors when a diagonal matrix transform is used for color correction, and reduces color distinction. Spectral sharpening is a transformation of colors introduced to reduce color-constancy errors when colors are collected through spectrally overlapping filters. Earlier color-constancy methods improved color precision when the illuminant color changed, but they overlooked color distinction. In this article, we introduce a new spectral sharpening technique that strikes a good compromise between color precision and distinction, based on real physical constraints. The spectral overlap is measured by observing a gray reference chart with a set of real, spectrally disjoint filters selected by the user. The new sharpening method makes it possible to sharpen colors obtained by a sensor without knowing the camera response functions. Experiments with real images showed that colors sharpened by the new method achieve good levels of both color precision and distinction. The color-constancy performance is compared with the data-based sharpening method in terms of both precision and distinction. © 2014 Wiley Periodicals, Inc. Col Res Appl, 40, 564–576, 2015
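At its core, sharpening wraps a von Kries diagonal correction in a change of sensor basis (sharpen with a matrix T, scale diagonally, map back with T⁻¹). The diagonal step alone, where overlap-induced errors arise, looks like this; all numeric values are hypothetical:

```python
def diagonal_correct(rgb, white_src, white_dst):
    # Von Kries diagonal transform: scale each channel by the ratio of
    # the destination to source illuminant responses. With strongly
    # overlapping filters this simple per-channel scaling misestimates
    # colors, which is what spectral sharpening mitigates.
    return [c * (d / s) for c, s, d in zip(rgb, white_src, white_dst)]

# A gray patch seen under a warm source illuminant (made-up responses):
gray_under_src = [0.5, 0.4, 0.3]
white_src = [1.0, 0.8, 0.6]        # source illuminant on a white patch
white_dst = [0.9, 0.9, 0.9]        # target (neutral) illuminant

corrected = diagonal_correct(gray_under_src, white_src, white_dst)
```

The gray patch maps to equal channel values, [0.45, 0.45, 0.45]; in a well-sharpened basis, the same diagonal scaling also stays accurate for non-neutral colors.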
83.
To meet the need for real-time monitoring of test data in aircraft fatigue testing, a real-time warning system for aircraft fatigue tests was developed in C#. Built on top of the fatigue test data management system, it computes the mean and standard deviation for each measuring point and each load case, and sets warning thresholds by several methods. The system has a friendly interface and complete functionality; it monitors fatigue test data in real time, detects abnormal test data promptly, and allows structural damage to be found early and effective measures to be taken, which can substantially reduce maintenance costs and shorten maintenance cycles.
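A minimal sketch of one such threshold rule (mean ± k·σ per measuring point; the sample data and the choice k = 3 are hypothetical):

```python
import statistics

def warning_thresholds(history, k=3.0):
    # Per-measuring-point thresholds: mean plus/minus k standard
    # deviations over past load-case readings (one common choice
    # among the several threshold-setting methods mentioned).
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return mu - k * sigma, mu + k * sigma

history = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7]  # strain readings
lo, hi = warning_thresholds(history)

def check(value):
    # Flag any new reading outside the warning band.
    return "ALARM" if not (lo <= value <= hi) else "ok"
```

A reading near the historical mean passes (`check(10.05)` returns `"ok"`), while an outlier such as 12.5 trips the alarm.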
84.
This research establishes a methodological framework for quantifying community resilience based on fluctuations in a population's activity during a natural disaster. Visits to points-of-interest (POIs) over time serve as a proxy for activities, capturing the combined effects of perturbations in lifestyles, the built environment, and the status of businesses. This study used digital trace data on unique visits to POIs in the Houston metropolitan area during Hurricane Harvey in 2017. Resilience metrics in the form of systemic impact, duration of impact, and general resilience (GR) values were examined for the region, along with their spatial distributions. The results show that certain categories, such as religious organizations and building material and supplies dealers, had better resilience metrics: low systemic impact, short duration of impact, and high GR. Other categories, such as medical facilities and entertainment, had worse resilience metrics: high systemic impact, long duration of impact, and low GR. Spatial analyses revealed that areas in the community with lower resilience metrics also experienced extensive flooding. This demonstrates the validity of the proposed approach for quantifying and analysing community resilience patterns using digital trace/location-intelligence data on population activities. While this study focused on the Houston metropolitan area and analysed only one natural hazard, the same approach could be applied to other communities and disaster contexts. Such resilience metrics bring valuable insight into prioritizing resource allocation in the recovery process.
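Illustrative versions of the three metrics, computed from a daily visit series; the exact definitions in the study may differ, and the 90% threshold and the data are assumptions:

```python
def resilience_metrics(visits, baseline):
    # Toy resilience metrics from a daily POI-visit series, in the
    # spirit of the abstract (not the authors' exact formulas).
    rel = [v / baseline for v in visits]
    systemic_impact = 1.0 - min(rel)            # worst relative drop
    duration = sum(1 for r in rel if r < 0.9)   # days below 90% of baseline
    gr = sum(rel) / len(rel)                    # mean retained performance
    return systemic_impact, duration, gr

# Hypothetical daily visits to a POI category around a storm:
visits = [100, 95, 40, 30, 55, 80, 95, 100]
impact, dur, gr = resilience_metrics(visits, baseline=100)
```

Here the category loses at most 70% of its baseline activity (impact 0.7), stays depressed for 4 days, and retains about 74% of normal activity on average over the window.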
85.
The purpose of this study is to develop a modification of the model developed by Chen and Zhu in 2004. Calculating stage and overall efficiencies precisely and consistently has become a major challenge for the two-stage DEA model, and most other models do not calculate the optimality of intermediates. Although the Chen and Zhu model does measure the optimality of intermediates, its efficiency scores still have shortcomings. The modified model, named the hybrid two-stage DEA model, bridges the gap between calculating the optimality of intermediates and keeping overall efficiency scores consistent. In addition to measuring the optimality of intermediates accurately, the model confines efficiency scores to the range from zero to one (a ratio efficiency score). In an empirical evaluation, we use data from 64 medical manufacturing firms to test the performance of the hybrid model and offer recommendations for the industry.
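The "ratio efficiency score" idea, with scores confined to (0, 1], can be illustrated by a drastically simplified single-input, single-output case; real two-stage DEA instead solves a linear program per decision-making unit:

```python
def ratio_efficiency(units):
    # Each unit's output/input ratio, normalized by the best ratio so
    # every score lands in (0, 1] -- a toy stand-in for the bounded
    # ratio efficiency scores of the hybrid two-stage DEA model.
    ratios = [out / inp for inp, out in units]
    best = max(ratios)
    return [r / best for r in ratios]

# (input, output) pairs for 4 hypothetical firms
units = [(10, 8), (5, 5), (8, 4), (6, 6)]
eff = ratio_efficiency(units)
```

Firms 2 and 4 are efficient (score 1.0); the others receive scores strictly between zero and one rather than unbounded values.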
86.
Solder paste printing (SPP) is a critical procedure in a surface mount technology (SMT) assembly line and one of the major contributors to defects in printed circuit boards (PCBs). The quality of SPP is influenced by multiple factors, such as squeegee speed, pressure, stencil separation speed, cleaning frequency, and cleaning profile. During printing, the printer environment varies dynamically due to physical changes in the solder paste, which can alter the relationships between the printing results and the influential factors. To reduce printing defects, it is critical to understand these dynamic relationships. This research focuses on determining printing performance during printing by implementing a wavelet-filtering-based temporal recurrent neural network. To reduce noise in the solder paste inspection (SPI) data, a three-dimensional dual-tree complex wavelet transformation is applied for low-pass noise filtering and signal reconstruction. A recurrent neural network is then used to model performance prediction with low noise interference; both the printing sequence and process-setting information are considered in the proposed model. The approach is validated on a practical dataset and compared with other commonly used data mining approaches. The results show that the proposed wavelet-based multi-dimensional temporal recurrent neural network can effectively predict printing process performance and is a promising approach to reducing defects and controlling cleaning frequency. The proposed model is expected to advance current research on smart manufacturing in surface mount technology.
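The noise-filtering step can be illustrated with a one-level Haar wavelet shrinkage, a much simpler stand-in for the paper's three-dimensional dual-tree complex wavelet transform; the signal and threshold are made up:

```python
def haar_denoise(signal, thresh):
    # One-level Haar transform: split into pairwise averages and
    # details, zero out small detail coefficients (hard threshold),
    # then reconstruct. Assumes an even-length signal.
    avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    det = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    det = [d if abs(d) > thresh else 0.0 for d in det]
    out = []
    for a, d in zip(avg, det):
        out += [a + d, a - d]
    return out

# A step-like "inspection" signal with small measurement jitter:
noisy = [1.0, 1.1, 4.0, 4.2, 1.0, 0.9, 4.1, 3.9]
clean = haar_denoise(noisy, thresh=0.2)
```

Small within-pair jitter is removed while the large steps between pairs, the actual process signal, survive intact; the denoised series can then feed the recurrent predictor.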
87.
Today’s information technologies involve increasingly intelligent systems, which come at the cost of increasingly complex equipment. Modern monitoring systems collect multi-measuring-point, long-term data, making equipment health prediction a “big data” problem, and it is difficult to extract information from such condition monitoring data to accurately estimate or predict health status. Deep learning is a powerful tool for big data processing that is widely used in image and speech recognition, and it can also provide effective predictions in industrial processes. This paper proposes Long Short-Term Memory Integrating Principal Component Analysis based on Human Experience (HEPCA-LSTM), which uses operational time-series data for equipment health prognostics. Principal component analysis based on human experience is first conducted to extract condition parameters from the condition monitoring system. A long short-term memory (LSTM) framework is then constructed to predict the target status. Finally, the prediction model is dynamically updated with incoming data at a certain interval to prevent model misalignment caused by the drifting of relevant variables. The proposed model is validated on a practical case and found to outperform other prediction methods: it uses a powerful deep learning method, the LSTM, to fully process big condition-monitoring series data; it effectively extracts features informed by human experience; and it takes dynamic updates into consideration.
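The periodic-update step can be sketched independently of the LSTM itself; here a trivial moving-average predictor stands in for the trained network, and the refit interval is hypothetical:

```python
def predict_with_updates(series, window=3, update_every=4):
    # The "model" (a moving average standing in for the LSTM) is refit
    # on the most recent window every `update_every` steps, so drifting
    # variables do not leave the predictor stale between refits.
    preds, model = [], None
    for t in range(window, len(series)):
        if model is None or (t - window) % update_every == 0:
            model = series[t - window:t]      # periodic "retraining"
        preds.append(sum(model) / window)
    return preds

# A steadily drifting condition parameter:
preds = predict_with_updates(list(range(10)))
```

Between refits the predictions stay flat at the last fitted level; each scheduled update snaps the predictor back onto the drifted data, which is the behavior the dynamic update in the abstract is meant to provide.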
88.
Based on the classic three-dimensional Chua circuit, a nonlinear circuit containing two flux-controlled memristors is designed. Because the characteristic equations of the two magnetron memristors are designed differently, their positions form a symmetrical structure with respect to the capacitor. The existence of chaotic behavior is established by analyzing the stability of the system, including its Lyapunov exponents, equilibrium points, eigenvalues, Poincaré map, power spectrum, and bifurcation diagram. Theoretical analysis and numerical calculation show that this heterogeneous memristive model is a hyperchaotic five-dimensional nonlinear dynamical system with strong chaotic behavior. The memristive system is then applied to digital image and speech signal processing. Analysis of the key space, the sensitivity to key parameters, and the statistical character of the encryption scheme implies that this model can be applied widely in multimedia information security.
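The encryption application can be sketched with a far simpler chaotic source, a logistic map generating a keystream, standing in for the five-dimensional hyperchaotic memristive system; the seed and parameter below are illustrative key material:

```python
def logistic_keystream(x0, r, n):
    # Keystream bytes from logistic-map iterates x <- r*x*(1-x).
    # The (x0, r) pair plays the role of the secret key.
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return out

def xor_cipher(data, keystream):
    # XOR is its own inverse, so the same call encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, keystream))

msg = b"speech or image bytes"
ks = logistic_keystream(0.123456, 3.99, len(msg))
ct = xor_cipher(msg, ks)      # encrypt
pt = xor_cipher(ct, ks)       # decrypt with the same keystream
```

Decryption with the exact key recovers the plaintext, while even a tiny change to x0 yields a diverging keystream, the sensitivity property the abstract's key-parameter analysis examines. (A bare logistic map is not cryptographically strong; it only illustrates the chaos-as-keystream idea.)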
89.
Patent circumvention design requires selecting target patents with core competitiveness from existing patents in order to raise the technical starting point. For the massive patent information in the design domain, a multi-indicator subjective-objective comprehensive evaluation model of patent quality is proposed. The positive correlation between bibliographic information and patent quality is analyzed, and bibliographic indicators (lifetime, number of claims, patent family size, citation count, and cited-by count) are selected to build a multi-indicator comprehensive evaluation system, improving the scientific soundness and operability of the evaluation method. A combined subjective-objective weighting method integrating the Delphi method, the analytic hierarchy process, and the mean-square-deviation decision method is proposed to determine the indicator weights, which preserves the authority of the evaluation while reducing the influence of personal bias and conformity. A process model for selecting target patents based on the quality evaluation model is established, and an application example verifies its effectiveness in selecting patents with core competitiveness.
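A toy version of the combined weighting: objective weights from the mean-square-deviation idea (indicators that vary more across patents get more weight) blended 50/50 with subjective Delphi/AHP weights. All indicator values, expert weights, and the blend ratio are hypothetical:

```python
import statistics

def msd_weights(matrix):
    # Objective weights: each indicator's dispersion across patents,
    # normalized to sum to one (mean-square-deviation style).
    cols = list(zip(*matrix))
    sds = [statistics.pstdev(c) for c in cols]
    total = sum(sds)
    return [s / total for s in sds]

def combined_score(values, subj_w, obj_w, alpha=0.5):
    # Blend subjective (Delphi/AHP) and objective weights, then score.
    w = [alpha * s + (1 - alpha) * o for s, o in zip(subj_w, obj_w)]
    return sum(wi * v for wi, v in zip(w, values))

# Normalized indicator values for 3 patents:
# [lifetime, claims, family size, citations, cited-by]
patents = [
    [0.9, 0.6, 0.8, 0.5, 0.7],
    [0.4, 0.9, 0.3, 0.8, 0.6],
    [0.7, 0.5, 0.6, 0.4, 0.9],
]
subj = [0.3, 0.2, 0.2, 0.15, 0.15]   # hypothetical expert weights
obj = msd_weights(patents)
scores = [combined_score(p, subj, obj) for p in patents]
best = max(range(len(patents)), key=lambda i: scores[i])
```

The highest-scoring patent would be the target-patent candidate passed to the circumvention-design process model.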
90.
In recent years, traditional violent crime and adult crime in China have trended downward, yet new types of criminal cases keep emerging. To improve crime prediction in public security practice and help combat various illegal and criminal activities, this paper proposes a new crime prediction model for crime data. Density-based clustering first groups the crime data; dimensionality reduction then extracts key attributes to generate feature data, which are weighted and optimized; finally, machine learning is applied to the features to predict the crime category. Experimental results show that, compared with traditional methods, the proposed method achieves better prediction performance and offers new support for solving and preventing similar cases in public security practice.
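A minimal sketch of the "weighted features + learned classifier" step, using a nearest-centroid rule; the features, weights, and crime categories are invented for illustration, and the paper's full pipeline also includes density clustering and dimensionality reduction:

```python
import math

def weighted_dist(a, b, w):
    # Euclidean distance with per-feature weights -- the abstract's
    # feature-weighting step (weights here are hypothetical).
    return math.sqrt(sum(wi * (x - y) ** 2 for x, y, wi in zip(a, b, w)))

def predict(case, centroids, w):
    # Assign a case to the nearest crime-category centroid.
    return min(centroids, key=lambda c: weighted_dist(case, centroids[c], w))

# Toy feature vectors: [hour-of-day / 24, district id / 10]
centroids = {"burglary": [0.1, 0.3], "fraud": [0.6, 0.8]}
w = [2.0, 1.0]   # time-of-day weighted more heavily (illustrative)
label = predict([0.15, 0.35], centroids, w)
```

A new case is labeled with whichever learned category profile it sits closest to in the weighted feature space.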